Dyadic and small-group collaboration is an evolutionarily advantageous behaviour, and the need for such collaboration arises regularly in day-to-day life. In this paper we estimate the perceived personality traits of individuals in dyads and small groups over thin slices of interaction on four multimodal datasets. We find that our transformer-based predictive model performs comparably to human annotators tasked with rating the perceived Big Five personality traits of participants. Using this model, we analyse the estimated perceived personality traits of individuals performing tasks in small groups and dyads. Permutation analysis shows that the perceived personality traits of members of small groups undergoing collaborative tasks cluster together; the same holds for dyads in a collaborative problem-solving task, but not for dyads in non-collaborative task settings. Additionally, we find that the group-level average of perceived personality traits is a better predictor of group performance than the group-level average of self-reported personality traits.
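The permutation analysis described above can be sketched as a within-group variance test: if perceived traits cluster within groups, the observed within-group variance should be lower than under random reassignment of members to groups. The function names and the choice of variance as the test statistic are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)


def within_group_variance(scores, groups):
    """Mean variance of a trait score within each group."""
    return np.mean([scores[groups == g].var() for g in np.unique(groups)])


def permutation_test(scores, groups, n_perm=2000, rng=rng):
    """One-sided permutation p-value for the hypothesis that the
    within-group variance is lower than expected if participants were
    assigned to groups at random (i.e. that traits cluster by group)."""
    observed = within_group_variance(scores, groups)
    perm_stats = np.empty(n_perm)
    for i in range(n_perm):
        # Shuffle trait scores across participants, keeping group sizes fixed.
        perm_stats[i] = within_group_variance(rng.permutation(scores), groups)
    # Add-one smoothing so the p-value is never exactly zero.
    return (np.sum(perm_stats <= observed) + 1) / (n_perm + 1)
```

On strongly clustered synthetic data (e.g. two groups with distinct trait levels) the returned p-value is small, indicating clustering beyond chance.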
Background: Egocentric video has emerged as a potential solution for monitoring hand function in people with tetraplegia in the community, particularly because of its ability to detect functional use in home environments. Objective: To develop and validate a wearable vision-based system for measuring hand use at home in people with tetraplegia. Methods: Several deep-learning algorithms for detecting functional hand-object interactions were developed and compared. The most accurate algorithm was used to extract measures of hand function from 65 hours of unscripted video recorded at home by 20 participants. These measures were: percentage of interaction time over total recording time (PERC); mean duration of individual interactions (DUR); and number of interactions per hour (NUM). To demonstrate the clinical validity of the technology, the extracted measures were correlated against validated clinical assessments of hand function and independence (Graded Redefined Assessment of Strength, Sensibility and Prehension - GRASSP; Upper Extremity Motor score - UEM; and Spinal Cord Independence Measure - SCIM). Results: Hand-object interactions were automatically detected with a median score of 0.80 (0.67-0.87). Our results showed that a higher UEM and better prehension were related to longer time spent interacting, whereas a higher SCIM and better hand sensation resulted in a greater number of interactions during the egocentric video recordings. Conclusion: For the first time, measures of hand function automatically estimated in an unconstrained environment in people with tetraplegia have been validated against internationally accepted measures of hand function. Future work will be needed to formally evaluate the reliability and responsiveness of the egocentric-video-based performance measures of hand use.
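The three hand-use measures (PERC, DUR, NUM) follow directly from the detected interaction intervals and the total recording time. A minimal sketch, assuming interactions are represented as (start, end) tuples in seconds (the function name and interval representation are illustrative, not the paper's implementation):

```python
def hand_use_metrics(interactions, total_hours):
    """Compute the three hand-use measures from detected interaction
    intervals.

    interactions : list of (start_s, end_s) tuples, in seconds
    total_hours  : total recorded time, in hours

    Returns (PERC, DUR, NUM):
      PERC - percentage of recorded time spent interacting
      DUR  - mean duration of a single interaction, in seconds
      NUM  - number of interactions per recorded hour
    """
    durations = [end - start for start, end in interactions]
    total_s = total_hours * 3600.0
    perc = 100.0 * sum(durations) / total_s
    dur = sum(durations) / len(durations) if durations else 0.0
    num = len(interactions) / total_hours
    return perc, dur, num
```

For example, three interactions of 10 s, 10 s, and 20 s in one hour of recording yield PERC of about 1.1%, DUR of about 13.3 s, and NUM of 3 interactions per hour.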